1) Precondition: takes the path to an image as input
2) Postcondition: if the image is classified as Nude, returns a blurred copy together with the class name; otherwise returns the original picture with the class name.
3) Implementation: the function takes an image path as input, converts the image into a tensor, predicts its class with the loaded model, and displays the picture with a corresponding message. If the picture is predicted as Nude, a blurred version is displayed instead, along with a warning message.
from keras.preprocessing import image
from keras.models import Sequential
from PIL import ImageFilter, Image
import numpy as np
import matplotlib.pyplot as plt
import cv2
%matplotlib inline
### Convert image to RGB-valued tensor
def path_to_tensor(img_path):
    # Load the image, resize it to the model's 224x224 input, and return a
    # (224, 224, 3) float array; the batch dimension is added by the caller.
    img = image.load_img(img_path, target_size=(224, 224))
    return image.img_to_array(img)
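For reference, the same preprocessing can be sketched without Keras, using only PIL and NumPy (a rough equivalent; `load_img` also uses PIL under the hood, though interpolation details may differ):

```python
from PIL import Image
import numpy as np

def path_to_tensor_pil(img_path):
    # Roughly equivalent preprocessing without Keras: open as RGB,
    # resize to the model's 224x224 input, and convert to float32.
    img = Image.open(img_path).convert("RGB").resize((224, 224))
    return np.asarray(img, dtype="float32")  # shape (224, 224, 3)
```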
def create_model():
    from keras.layers import Conv2D, MaxPooling2D, GlobalAveragePooling2D
    from keras.layers import Dropout, Flatten, Dense, Activation
    from keras.models import Sequential

    # CNN with a sigmoid output layer
    model = Sequential()
    model.add(Conv2D(32, (3, 3), input_shape=(224, 224, 3)))
    model.add(Activation('relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Conv2D(32, (3, 3)))
    model.add(Activation('relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Conv2D(64, (3, 3)))
    model.add(Activation('relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Flatten())
    model.add(Dense(64, activation="relu"))
    model.add(Dropout(0.3))
    model.add(Dense(2, activation="sigmoid"))
    return model
def print_image(img):
    # cv2 reads images in BGR channel order; convert to RGB before plotting.
    im = cv2.imread(img)
    cv_rgb = cv2.cvtColor(im, cv2.COLOR_BGR2RGB)
    plt.imshow(cv_rgb)
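The BGR-to-RGB conversion in `print_image` is needed because OpenCV stores channels in BGR order while matplotlib expects RGB. For 3-channel images the conversion amounts to reversing the channel axis, which can be sketched in plain NumPy (without cv2):

```python
import numpy as np

def bgr_to_rgb(img):
    # Swap the blue and red channels by reversing the last axis.
    # For 3-channel images this matches cv2.cvtColor(img, cv2.COLOR_BGR2RGB).
    return img[..., ::-1]
```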
def nude_detector(image_path):
    # Open the image
    im = Image.open(image_path)
    # Convert the image to a tensor and rescale pixel values to [0, 1]
    im_tensor = path_to_tensor(image_path).astype('float32') / 255
    # Create the model and load the optimal weights
    model = create_model()
    model.load_weights('saved_models/weights.best.finalCNN.hdf5')
    # Make prediction (class 1 = Safe, class 0 = Nude)
    prediction = np.argmax(model.predict(np.expand_dims(im_tensor, axis=0)))
    # Check the output
    if prediction:
        print_image(image_path)
        print("This image is Safe")
        return (im, "Safe")
    else:
        # Blur 10 times!
        nude_image = im
        for _ in range(10):
            nude_image = nude_image.filter(ImageFilter.GaussianBlur(radius=10))
        nude_image.save("nude_temp.jpg")
        print_image("nude_temp.jpg")
        print("This image contains nudity!")
        return (nude_image, "Nude")
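A note on the tenfold blur: successive Gaussian blurs compose into a single stronger Gaussian, with variances adding, so ten passes at radius 10 behave roughly like one pass at radius 10·√10 ≈ 31.6. The sketch below (synthetic noise image, PIL and NumPy only) just illustrates the cumulative smoothing:

```python
from PIL import Image, ImageFilter
import numpy as np

# A synthetic noise image stands in for a real photo.
rng = np.random.default_rng(0)
img = Image.fromarray(rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8), "RGB")

blurred_once = img.filter(ImageFilter.GaussianBlur(radius=10))
blurred_ten = img
for _ in range(10):
    blurred_ten = blurred_ten.filter(ImageFilter.GaussianBlur(radius=10))

# Each extra pass smooths further: pixel variance keeps dropping.
var_once = np.asarray(blurred_once, dtype="float32").var()
var_ten = np.asarray(blurred_ten, dtype="float32").var()
```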
nude_detector("data/app_test/safe/safe (1).jpg")
nude_detector("data/app_test/safe/safe (2).jpg")
nude_detector("data/app_test/safe/safe (3).jpg")
nude_detector("data/app_test/safe/safe (4).jpg")
nude_detector("data/app_test/safe/safe (5).jpg")
nude_detector("data/app_test/safe/safe (6).jpg")
nude_detector("data/app_test/safe/safe (7).jpg")
nude_detector("data/app_test/safe/safe (8).jpg")
nude_detector("data/app_test/safe/safe (9).jpg")
nude_detector("data/app_test/safe/safe (10).jpg")
As we can see, the algorithm does a good job detecting Safe images! 100% of our new samples were correctly classified as Safe!
nude_detector("data/app_test/ambiguous/amb (1).jpg")
nude_detector("data/app_test/ambiguous/amb (2).jpg")
nude_detector("data/app_test/ambiguous/amb (3).jpg")
nude_detector("data/app_test/ambiguous/amb (4).jpg")
nude_detector("data/app_test/ambiguous/amb (5).jpg")
nude_detector("data/app_test/ambiguous/amb (6).jpg")
nude_detector("data/app_test/ambiguous/amb (7).jpg")
nude_detector("data/app_test/ambiguous/amb (8).jpg")
nude_detector("data/app_test/ambiguous/amb (9).jpg")
nude_detector("data/app_test/ambiguous/amb (10).jpg")
As we can see, our algorithm classifies 80% of the ambiguously nude images as Nude. This is consistent with the recall of 81% we measured earlier.
nude_detector("data/app_test/nude/n (1).jpg")
nude_detector("data/app_test/nude/n (2).jpg")
nude_detector("data/app_test/nude/n (3).jpg")
nude_detector("data/app_test/nude/n (4).jpg")
nude_detector("data/app_test/nude/n (5).jpg")
nude_detector("data/app_test/nude/n (6).jpg")
nude_detector("data/app_test/nude/n (7).jpg")
nude_detector("data/app_test/nude/n (8).jpg")
nude_detector("data/app_test/nude/n (9).jpg")
nude_detector("data/app_test/nude/n (10).jpg")
As we can see, our algorithm correctly classifies all of the unambiguously Nude pictures. Under no circumstances should these be labeled Safe, and the algorithm does a good job detecting their nudity!
Here we test some cases where our algorithm fails to predict the class of the image correctly. Note that these false predictions are not always clear-cut: whether an image counts as Nude depends on who you ask, since people differ in their tolerance of nudity. Still, the algorithm fails to generalize and distinguish nude from non-nude features in the cases below:
nude_detector("data/app_test/false_pred/fp (1).jpg")
print("Original Picture")
print_image("data/app_test/false_pred/fp (2).jpg")
print("False Prediction!")
nude_detector("data/app_test/false_pred/fp (2).jpg")
print("False Prediction!")
nude_detector("data/app_test/false_pred/fp (3).jpg")
print("Original Picture")
print_image("data/app_test/false_pred/fp (4).jpg")
print("False Prediction!")
nude_detector("data/app_test/false_pred/fp (4).jpg")
print("False Prediction!")
nude_detector("data/app_test/false_pred/fp (5).jpg")
print("Original Picture")
print_image("data/app_test/false_pred/fp (6).jpg")
print("False Prediction!")
nude_detector("data/app_test/false_pred/fp (6).jpg")
print("False Prediction!")
nude_detector("data/app_test/false_pred/fp (7).jpg")
print("Original Picture")
print_image("data/app_test/false_pred/fp (8).jpg")
print("False Prediction!")
nude_detector("data/app_test/false_pred/fp (8).jpg")
print("Correct Prediction!")
nude_detector("data/app_test/false_pred/fp (9).jpg")
print("Original Picture")
print_image("data/app_test/false_pred/fp (10).jpg")
print("False Prediction!")
nude_detector("data/app_test/false_pred/fp (10).jpg")
As you can see, the last two pictures both reveal a lot of skin. The algorithm classifies the second-to-last one correctly, but labels the last one Nude, erring on the side of caution. For our purposes, this is not necessarily a bad thing.
Overall, the algorithm performs well, and the mistakes it does make are often on images whose class is genuinely open to debate.